
    Joint Doctrine Ontology: A Benchmark for Military Information Systems Interoperability

    When the U.S. conducts warfare, elements of a force are drawn from different services and work together as a single team to accomplish an assigned mission. To achieve such unified action, it is necessary that the doctrines governing the actions of members of specific services be both consistent with and subservient to joint doctrine. Because warfighting today increasingly involves not only live forces but also automated systems, unified action requires that the information technology used in joint warfare be aligned with joint doctrine. It also requires that the separate information systems used by the different elements of a joint force be interoperable, in the sense that the data and information generated by each element must be usable (understandable, processable) by all other elements that need them. Currently, such interoperability is impeded by multiple inconsistencies among the different data and software standards used by warfighters. We describe here the ongoing project of creating a Joint Doctrine Ontology (JDO), which uses joint doctrine to provide shared computer-accessible content valid for any field of military endeavor, organization, and information system. JDO addresses the two previously mentioned requirements of unified action by providing a widely applicable benchmark for use by developers of information systems that will both guarantee alignment with joint doctrine and support interoperability.
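
    As a rough illustration of the benchmark idea, the sketch below (plain Python; the field names and the jdo:TargetLocation term are hypothetical, not drawn from the actual JDO vocabulary) shows how mapping service-specific field names onto a single shared term lets data from different systems be processed uniformly.

        # Minimal sketch with hypothetical names: two service-specific systems label the
        # same kind of information differently; mapping both labels to one shared JDO term
        # lets a single consumer process data from either source.
        JDO_ALIGNMENT = {
            "tgt_loc": "jdo:TargetLocation",              # Army-style field name (hypothetical)
            "objective_position": "jdo:TargetLocation",   # Navy-style field name (hypothetical)
        }

        def to_joint_record(record: dict) -> dict:
            """Rewrite service-specific keys into shared JDO terms where a mapping exists."""
            return {JDO_ALIGNMENT.get(key, key): value for key, value in record.items()}

        army_msg = {"tgt_loc": "38.89N 77.03W"}
        navy_msg = {"objective_position": "36.85N 76.29W"}

        # Both messages now use the same vocabulary and can be consumed by the same software.
        print(to_joint_record(army_msg))
        print(to_joint_record(navy_msg))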

    An Ontological Approach to Representing the Product Life Cycle

    The ability to access and share data is key to optimizing and streamlining any industrial production process. Unfortunately, the manufacturing industry is stymied by a lack of interoperability among the systems by which data are produced and managed, and this is true both within and across organizations. In this paper, we describe our work to address this problem through the creation of a suite of modular ontologies representing the product life cycle and its successive phases, from design to end of life. We call this suite the Product Life Cycle (PLC) Ontologies. The suite extends proximately from the Common Core Ontologies (CCO), which are used widely in defense and intelligence circles, and ultimately from the Basic Formal Ontology (BFO), which serves as the top-level ontology for the CCO and for some 300 further ontologies. The PLC Ontologies were developed together, but they have been factored to cover particular domains such as design, manufacturing processes, and tools. We argue that these ontologies, when used together with standard public-domain alignment and browsing tools created within the context of the Semantic Web, may offer a low-cost approach to solving increasingly costly problems of data management in the manufacturing industry.
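
    The modular structure can be pictured with a small rdflib sketch, shown below with placeholder IRIs rather than the published BFO, CCO, or PLC identifiers: each module is declared as an owl:Ontology and imports the module it extends.

        # Illustrative sketch using rdflib (the IRIs are placeholders): a domain module for
        # design imports the CCO, which in turn imports BFO as its top-level ontology.
        from rdflib import Graph, URIRef
        from rdflib.namespace import OWL, RDF

        g = Graph()
        bfo    = URIRef("http://example.org/bfo")          # placeholder for Basic Formal Ontology
        cco    = URIRef("http://example.org/cco")          # placeholder for the Common Core Ontologies
        design = URIRef("http://example.org/plc/design")   # placeholder for a PLC design module

        for onto in (bfo, cco, design):
            g.add((onto, RDF.type, OWL.Ontology))

        g.add((cco, OWL.imports, bfo))        # CCO builds on BFO
        g.add((design, OWL.imports, cco))     # the design module extends the CCO

        print(g.serialize(format="turtle"))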

    IAO-Intel: An Ontology of Information Artifacts in the Intelligence Domain

    We describe ongoing work on IAO-Intel, an information artifact ontology developed as part of a suite of ontologies designed to support the needs of the US Army intelligence community within the framework of the Distributed Common Ground System (DCGS-A). IAO-Intel provides a controlled, structured vocabulary for the consistent formulation of metadata about documents, images, emails, and other carriers of information. It will provide a resource for the uniform explication of the terms used in multiple existing military dictionaries, thesauri, and metadata registries, thereby enhancing the degree to which the content formulated with their aid will be available to computational reasoning.
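
    The sketch below suggests, in rdflib and with invented class and property names rather than actual IAO-Intel terms, how metadata about a document might be expressed against such a controlled vocabulary.

        # Illustrative only: the iao-intel namespace, classes, and properties here are made up.
        from rdflib import Graph, Namespace, Literal
        from rdflib.namespace import RDF, RDFS

        IAO = Namespace("http://example.org/iao-intel/")   # placeholder namespace
        EX  = Namespace("http://example.org/data/")

        g = Graph()
        g.bind("iao", IAO)

        doc = EX["report-0001"]
        g.add((doc, RDF.type, IAO.Document))                        # hypothetical class
        g.add((doc, RDFS.label, Literal("Field report, 12 June")))
        g.add((doc, IAO.hasAuthorRole, EX["analyst-17"]))           # hypothetical property
        g.add((doc, IAO.isAboutRegion, EX["region-alpha"]))         # hypothetical property

        print(g.serialize(format="turtle"))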

    Integration of Transcriptomics, Proteomics, and MicroRNA Analyses Reveals Novel MicroRNA Regulation of Targets in the Mammalian Inner Ear

    We have employed a novel approach for the identification of functionally important microRNA (miRNA)-target interactions, integrating miRNA, transcriptome, and proteome profiles with advanced in silico analysis using the FAME algorithm. Since miRNAs play a crucial role in the inner ear, as demonstrated by the discovery of mutations in a miRNA leading to human and mouse deafness, we applied this approach to microdissected auditory and vestibular sensory epithelia. We detected the expression of 157 miRNAs in the inner ear sensory epithelia, with 53 miRNAs differentially expressed between the cochlea and vestibule. Functionally important miRNAs were determined by searching for enriched or depleted targets in the transcript and protein datasets with an expression consistent with the dogma of miRNA regulation. Importantly, quite a few of the targets were detected only in the protein datasets, which is attributable to regulation by translational suppression. We identified and experimentally validated the regulation of PSIP1-P75, a transcriptional co-activator previously unknown in the inner ear, by miR-135b in vestibular hair cells. Our findings suggest that miR-135b serves as a cellular effector, involved in regulating some of the differences between the cochlear and vestibular hair cells.
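
    The selection logic, reduced to a toy pandas example with invented numbers (and not the FAME algorithm itself), amounts to keeping miRNA-target pairs whose expression moves in opposite directions between the two epithelia, consistent with the dogma of miRNA-mediated repression.

        # Toy illustration only: the values are invented and the filter is a simplification
        # of the consistency test described in the abstract.
        import pandas as pd

        mirna = pd.DataFrame({
            "mirna":          ["miR-135b", "miR-99a"],
            "cochlea_expr":   [1.0, 6.0],
            "vestibule_expr": [8.0, 5.5],
        })

        targets = pd.DataFrame({
            "mirna":           ["miR-135b", "miR-135b", "miR-99a"],
            "target":          ["Psip1",    "GeneX",    "GeneY"],
            "cochlea_level":   [9.0,        2.0,        4.0],
            "vestibule_level": [2.0,        2.1,        4.2],
        })

        merged = targets.merge(mirna, on="mirna")

        # Keep pairs where the miRNA and its putative target change in opposite directions.
        consistent = merged[
            ((merged.vestibule_expr > merged.cochlea_expr) & (merged.vestibule_level < merged.cochlea_level)) |
            ((merged.vestibule_expr < merged.cochlea_expr) & (merged.vestibule_level > merged.cochlea_level))
        ]
        print(consistent[["mirna", "target"]])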

    Graph Data Structures presentation

    Ontologies serve multiple roles in knowledge systems, including the standardization of different vocabularies within an enterprise, the integration of disparate data sources into a single model, and enabling reasoning over data. The mini-symposium on graph data structures will cover all of these roles but will focus primarily on the use of ontologies as the logical model for graph representations of data. The presentation will include a description of the structure and content of ontologies; an introduction to the languages used to construct ontologies: the Resource Description Framework (RDF), the Resource Description Framework Schema (RDF-S), and the Web Ontology Language (OWL); and a description of some of the best practices for building ontologies. Following the presentation part of the symposium, there will be a hands-on portion in which participants will be able to use some of the tools for building ontologies, executing logical rules for reasoning, generating ontology-aligned data, and querying that data once created.
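
    A minimal, self-contained rdflib example of that workflow, with purely illustrative names and IRIs, is sketched below: define a small RDFS vocabulary, generate data aligned to it, and query the resulting graph with SPARQL.

        # Illustrative example: a tiny vocabulary, data aligned to it, and a SPARQL query.
        from rdflib import Graph, Namespace, Literal
        from rdflib.namespace import RDF, RDFS

        EX = Namespace("http://example.org/enterprise/")

        g = Graph()
        g.bind("ex", EX)

        # Vocabulary: a small class hierarchy expressed in RDFS.
        g.add((EX.Employee, RDF.type, RDFS.Class))
        g.add((EX.Manager, RDFS.subClassOf, EX.Employee))

        # Data aligned to that vocabulary.
        g.add((EX.alice, RDF.type, EX.Manager))
        g.add((EX.alice, RDFS.label, Literal("Alice")))
        g.add((EX.bob, RDF.type, EX.Employee))
        g.add((EX.bob, RDFS.label, Literal("Bob")))

        # SPARQL query over the graph: list every typed individual and its label.
        results = g.query("""
            PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
            SELECT ?person ?label WHERE {
                ?person a ?type ;
                        rdfs:label ?label .
            }
        """)
        for person, label in results:
            print(person, label)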

    What particulars are referred to in EHR data? A case study in integrating referent tracking into an electronic health record application

    Referent Tracking (RT) advocates the use of instance unique identifiers to refer to the entities comprising the subject matter of patient health records. RT promises many benefits to those who use health record data to improve patient care. To further the adoption of this paradigm, we provide an illustration of how data from an EHR application needs to be decomposed in order to make it accord with the tenets of RT. We describe the ontological principles on which this decomposition is based in order to allow integration efforts to be applied in similar ways to other EHR applications. We find that an ordinary statement from an EHR contains a surprising amount of “hidden” data that are only revealed by its decomposition according to these principles.
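
    A simplified sketch of such a decomposition is given below in plain Python; the particulars and relation names are illustrative, not the schema of any actual EHR application or of the case study itself.

        # Illustrative only: a single EHR statement such as "the patient was diagnosed with
        # otitis media on 2015-03-02" implicitly refers to several particulars, each of which
        # receives its own instance unique identifier (IUI).
        import uuid
        from datetime import date

        def mint_iui() -> str:
            """Mint an instance unique identifier for one particular."""
            return f"IUI-{uuid.uuid4()}"

        # Particulars hidden in the single statement: the patient, the disorder instance
        # inhering in that patient, the diagnostic act, and the clinician who performed it.
        patient   = mint_iui()
        disorder  = mint_iui()
        diagnosis = mint_iui()
        clinician = mint_iui()

        # Relational assertions among the identified particulars (relation names invented).
        assertions = [
            (disorder,  "inheres_in",   patient),
            (diagnosis, "is_about",     disorder),
            (diagnosis, "performed_by", clinician),
            (diagnosis, "occurred_on",  date(2015, 3, 2).isoformat()),
        ]

        for subject, relation, obj in assertions:
            print(subject, relation, obj)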

    Controlled and uncontrolled English for ontology editing

    Ontologies formally represent reality in a way that limits ambiguity and facilitates automated reasoning and data fusion, but that is often daunting to the non-technical user. Thus, many researchers have endeavored to hide the formal syntax and semantics of ontologies behind the constructs of Controlled Natural Languages (CNLs), which retain the formal properties of ontologies while simultaneously presenting that information in a comprehensible natural-language format. In this paper, we build upon previous work in this field by evaluating the prospects of implementing International Technology Alliance Controlled English (ITACE) as middleware for ontology editing. We also discuss at length a prototype of a natural language conversational interface application designed to facilitate ontology editing via the formulation of CNL constructs.
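
    The toy sketch below conveys the basic idea, although ITA Controlled English has a far richer grammar than this single pattern: a restricted English sentence is parsed and mapped deterministically onto a formal axiom, here an rdfs:subClassOf assertion built with rdflib.

        # Toy illustration only; not ITACE and not the prototype described in the paper.
        import re
        from rdflib import Graph, Namespace
        from rdflib.namespace import RDF, RDFS, OWL

        EX = Namespace("http://example.org/onto/")

        def add_axiom(graph: Graph, sentence: str) -> None:
            """Translate sentences of the form 'Every X is a Y.' into a subclass axiom."""
            match = re.fullmatch(r"Every (\w+) is an? (\w+)\.", sentence.strip())
            if not match:
                raise ValueError(f"Sentence not in the controlled fragment: {sentence!r}")
            sub, sup = (EX[name.capitalize()] for name in match.groups())
            for cls in (sub, sup):
                graph.add((cls, RDF.type, OWL.Class))
            graph.add((sub, RDFS.subClassOf, sup))

        g = Graph()
        g.bind("ex", EX)
        add_axiom(g, "Every tank is a vehicle.")
        print(g.serialize(format="turtle"))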